    The Effect of Network and Infrastructural Variables on SPDY's Performance

    HTTP is a successful Internet technology on top of which much of the web resides. However, limitations of its current specification, HTTP/1.1, have encouraged the search for the next generation of HTTP. With SPDY, Google has put forward such a proposal, one with growing community acceptance, especially after being adopted by the IETF HTTPbis-WG as the basis for HTTP/2.0. SPDY has the potential to greatly improve the web experience with little deployment overhead. However, we still lack an understanding of its true potential in different environments. This paper addresses that gap, offering a comprehensive evaluation of SPDY's performance using extensive experiments. We identify the impact of network characteristics and website infrastructure on SPDY's potential page-loading benefits, finding that these factors are decisive for SPDY and for its optimal deployment strategy. Through this, we feed into the wider debate regarding HTTP/2.0, exploring the key aspects that impact the performance of this future protocol.

    Fine Grained Component Engineering of Adaptive Overlays: Experiences and Perspectives

    Recent years have seen significant research into peer-to-peer (P2P) systems. This work has focused on the styles and applications of P2P computing, from grid computation to content distribution; however, little investigation has been performed into how these systems are built. Component-based engineering is an approach that has seen successful deployment in the field of middleware development: functionality is encapsulated in ‘building blocks’ that can be dynamically plugged together to form complete systems. This allows efficient, flexible and adaptable systems to be built with lower overhead and development complexity. This paper presents an investigation into the potential of component-based engineering in the design and construction of peer-to-peer overlays. We highlight that the quality of these properties is dictated by the component architecture used to implement the system. Three reusable decomposition architectures are designed and evaluated using Chord and Pastry case studies. These demonstrate that significant improvements can be made over traditional design approaches, resulting in much more reusable, (re)configurable and extensible systems.
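    As a rough, purely illustrative sketch of the ‘building block’ idea (not one of the three decomposition architectures the paper actually evaluates), an overlay node's routing logic can be treated as a pluggable component behind a small interface, so that a Chord-like or Pastry-like strategy can be swapped without touching the rest of the node:

        # Hypothetical sketch: routing as a swappable component. Class and method
        # names are illustrative assumptions, not the paper's architectures.
        from abc import ABC, abstractmethod

        class RoutingComponent(ABC):
            @abstractmethod
            def next_hop(self, key: int, neighbours: list[int]) -> int:
                """Choose which neighbour to forward a lookup for `key` to."""

        class GreedyRing(RoutingComponent):
            # Chord-like: the neighbour closest to the key going clockwise on a 16-bit ring.
            def next_hop(self, key, neighbours):
                return min(neighbours, key=lambda n: (key - n) % 2**16)

        class PrefixMatch(RoutingComponent):
            # Pastry-like: the neighbour sharing the longest hex prefix with the key.
            def next_hop(self, key, neighbours):
                def shared_prefix(n):
                    count = 0
                    for a, b in zip(f"{key:04x}", f"{n:04x}"):
                        if a != b:
                            break
                        count += 1
                    return count
                return max(neighbours, key=shared_prefix)

        # The surrounding node code never changes when the component is swapped.
        for component in (GreedyRing(), PrefixMatch()):
            print(hex(component.next_hop(0x1A2B, [0x0FFF, 0x1A00, 0x2000])))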

    Charting an intent driven network

    The current strong divide between applications and the network control plane is desirable for many reasons; but a downside is that the network is kept in the dark regarding the ultimate purposes and intentions of applications and, as a result, is unable to optimize for these. An alternative approach, explored in this paper, is for applications to declare their abstract intents and assumptions to the network; e.g. "this is a Tweet", or "this application will run within a local domain". Such enriched semantics have the potential to enable the network to better fulfill application intent, while also helping to optimize network resource usage across applications. We refer to this approach as 'intent driven networking' (IDN), and we sketch an incrementally deployable design to serve as a stepping stone towards a practical realization of the IDN concept within today's Internet.
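    The paper sketches the concept rather than a concrete API, so the following is only a hypothetical illustration of what an application-side intent declaration, along the lines of the examples quoted above, might look like as a structured message; all field names are assumptions:

        # Hypothetical sketch of an application declaring intent to the network.
        # The JSON shape and every field name are assumptions for illustration only.
        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class IntentDeclaration:
            application: str         # which application is speaking
            content_class: str       # abstract description of the traffic
            scope: str               # e.g. "local-domain" or "global"
            latency_sensitive: bool  # a hint the network may use when scheduling

        declaration = IntentDeclaration(
            application="example-social-client",
            content_class="short-status-update",  # "this is a Tweet"
            scope="local-domain",                  # "will run within a local domain"
            latency_sensitive=True,
        )

        # A control-plane element could consume this to choose routes or caching policy.
        print(json.dumps(asdict(declaration), indent=2))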

    Pythia: a Framework for the Automated Analysis of Web Hosting Environments

    A common approach when setting up a website is to utilize third-party Web hosting and content delivery networks. Without taking this trend into account, any measurement study inspecting the deployment and operation of websites can be heavily skewed. Unfortunately, the research community lacks generalizable tools that can be used to identify how and where a given website is hosted. Instead, a number of ad hoc techniques have emerged, e.g., using Autonomous System databases or matching domain prefixes in CNAME records. In this work we propose Pythia, a novel lightweight approach for identifying Web content hosted on third-party infrastructures, including both traditional Web hosts and content delivery networks. Our framework identifies the organization to which a given Web page belongs, and it detects which Web servers are self-hosted and which ones leverage third-party services to provide content. To test our framework we ran it on 40,000 URLs and evaluated its accuracy, both by comparing the results with similar services and against a manually validated ground truth. Our tool achieves an accuracy of 90% and detects that under 11% of popular domains are self-hosted. We publicly release our tool to allow other researchers to reproduce our findings and to apply it to their own studies.
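    Pythia's own pipeline is not spelled out in the abstract, but the kind of ad hoc CNAME-based check it aims to replace can be illustrated with a short sketch. This is not Pythia's code: it assumes the third-party dnspython package, and the suffix-to-provider map is a small hand-picked sample:

        # Illustrative only: guess the hosting/CDN provider from a CNAME suffix.
        # Assumes `pip install dnspython`; the suffix map is a deliberately tiny sample.
        import dns.resolver

        KNOWN_SUFFIXES = {
            "cloudfront.net.": "Amazon CloudFront",
            "akamaiedge.net.": "Akamai",
            "fastly.net.": "Fastly",
        }

        def cname_provider(hostname):
            """Return a provider name if the CNAME target ends in a known suffix."""
            try:
                answers = dns.resolver.resolve(hostname, "CNAME")
            except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
                return None  # no CNAME record: possibly self-hosted, or A/AAAA only
            for record in answers:
                target = record.target.to_text().lower()
                for suffix, provider in KNOWN_SUFFIXES.items():
                    if target.endswith(suffix):
                        return provider
            return None

        print(cname_provider("www.example.com"))

    A script like this misses multi-level CNAME chains, customer-branded provider hostnames, and anything that needs Autonomous System or WHOIS data, which is the gap in generalizability the paper targets.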

    Trollslayer: Crowdsourcing and Characterization of Abusive Birds in Twitter

    As of today, abuse is a pressing issue for participants and administrators of Online Social Networks (OSN). Abuse in Twitter can stem from arguments generated to influence the outcome of a political election, the use of bots to automatically spread misinformation, and, generally speaking, activities that deny, disrupt, degrade or deceive other participants and/or the network. Given the difficulty in finding and accessing a large enough sample of abuse ground truth from the Twitter platform, we built and deployed a custom crawler that we use to judiciously collect a new dataset from the Twitter platform with the aim of characterizing the nature of abusive users, a.k.a. abusive birds, in the wild. We provide a comprehensive set of features based on users' attributes as well as social-graph metadata. The former includes metadata about the account itself, while the latter is computed from the social graph between the sender and the receiver of each message. Attribute-based features are useful for characterizing users' accounts in OSN, while graph-based features can reveal the dynamics of information dissemination across the network. In particular, we derive the Jaccard index as a key feature to reveal the benign or malicious nature of directed messages in Twitter. To the best of our knowledge, we are the first to propose such a similarity metric to characterize abuse in Twitter.
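    The abstract does not pin down which neighbour sets the Jaccard index is computed over; a minimal sketch, assuming it compares the follower sets of a message's sender and receiver (the user names below are made up), would be:

        # Minimal sketch of a Jaccard similarity feature between two Twitter users.
        # Assumption: computed over follower sets; the paper's exact definition may differ.

        def jaccard_index(set_a, set_b):
            """Return |A intersect B| / |A union B|, or 0.0 when both sets are empty."""
            union = set_a | set_b
            if not union:
                return 0.0
            return len(set_a & set_b) / len(union)

        # Hypothetical follower sets for the sender and receiver of a directed tweet.
        sender_followers = {"alice", "bob", "carol"}
        receiver_followers = {"bob", "carol", "dave", "erin"}

        # A low overlap suggests the sender is messaging someone outside their usual
        # audience, one possible signal when separating benign from abusive messages.
        print(jaccard_index(sender_followers, receiver_followers))  # 2 / 5 = 0.4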

    Are People Really Social on Porn 2.0?

    Social Web 2.0 features have become a vital component of a variety of multimedia systems, e.g., Last.fm, Flickr and Spotify. Interestingly, adult video websites are also starting to adopt these Web 2.0 principles, giving rise to the term "Porn 2.0". This paper examines a large Porn 2.0 social network, through data covering 563k users. We explore a number of unusual behavioural aspects that set this apart from more traditional multimedia social networks, including differences in browsing activity, social communications and relationship creation. We also analyse the nature and behaviour of content sharing, highlighting the role it plays in the Porn 2.0 community, as well as the preferences that users have when deciding what to consume. We particularly explore the impact that gender and sexuality have on these issues, showing their vital importance for aspects such as profile popularity.